Exact Top-k Feature Selection via ℓ2,0-Norm Constraint
Authors
Abstract
In this paper, we propose a novel robust and pragmatic feature selection approach. Unlike sparse-learning-based feature selection methods, which tackle an approximate problem by imposing a sparsity regularizer in the objective function, the proposed method has only a single ℓ2,1-norm loss term with an explicit ℓ2,0-norm equality constraint. An efficient algorithm based on the augmented Lagrangian method is derived to solve this constrained optimization problem and find a stable local solution. Extensive experiments on four biological datasets show that, although the proposed model is not convex, it outperforms its approximate convex counterparts as well as state-of-the-art feature selection methods, evaluated by the classification accuracy of two popular classifiers. Moreover, since the regularization parameter of our method has an explicit meaning, namely the number of selected features, it avoids the burden of parameter tuning, making it a pragmatic feature selection method.
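Read literally, the model pairs a single ℓ2,1-norm loss with an exact row-sparsity constraint, min_W ||XW − Y||_{2,1} s.t. ||W||_{2,0} = k. The sketch below illustrates an augmented Lagrangian solver of the kind the abstract describes, assuming data X (n × d) and a label matrix Y (n × c); it is a minimal reconstruction from the abstract, not the authors' implementation, and the IRLS inner solver, the penalty schedule (`mu`, `rho`), and all function names are assumptions.

```python
# Minimal ALM sketch for: min_W ||X W - Y||_{2,1}  s.t.  ||W||_{2,0} = k.
# Illustrative reconstruction only; not the authors' released code.
import numpy as np

def row_norms(M):
    """l2 norm of each row of M."""
    return np.sqrt((M ** 2).sum(axis=1))

def project_l20(M, k):
    """Euclidean projection onto {V : ||V||_{2,0} = k}: keep the k rows
    of M with largest l2 norm, zero out the rest."""
    V = np.zeros_like(M)
    top = np.argsort(row_norms(M))[-k:]
    V[top] = M[top]
    return V

def alm_top_k_select(X, Y, k, mu=1.0, rho=1.1, n_outer=50, n_inner=5):
    """X: (n_samples, n_features), Y: (n_samples, n_classes)."""
    d, c = X.shape[1], Y.shape[1]
    W = np.zeros((d, c))
    V = np.zeros((d, c))     # auxiliary copy of W carrying the l2,0 constraint
    Lam = np.zeros((d, c))   # Lagrange multipliers
    for _ in range(n_outer):
        Z = V - Lam / mu
        for _ in range(n_inner):  # IRLS inner loop for the l2,1 loss term
            E = X @ W - Y
            d_inv = 1.0 / (2.0 * np.maximum(row_norms(E), 1e-8))
            XtDX = X.T @ (d_inv[:, None] * X)
            W = np.linalg.solve(2 * XtDX + mu * np.eye(d),
                                2 * X.T @ (d_inv[:, None] * Y) + mu * Z)
        V = project_l20(W + Lam / mu, k)   # the exact top-k step
        Lam = Lam + mu * (W - V)
        mu *= rho
    return np.argsort(row_norms(V))[-k:]   # indices of selected features
```

The `project_l20` step is what makes the top-k constraint exact: at every outer iteration the auxiliary variable keeps exactly k nonzero rows, so the number of selected features is set directly rather than tuned indirectly through a regularization weight.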
Related Resources
Primal-Dual Framework for Feature Selection using Least Squares Support Vector Machines
Least Squares Support Vector Machines (LSSVM) perform classification using an L2-norm penalty on the weight vector and a squared loss function with linear constraints. The major advantage over the classical L2-norm support vector machine (SVM) is that LSSVM solves a system of linear equations rather than a quadratic programming problem. The L2-norm penalty on the weight vectors is known to robustly select...
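To illustrate the "linear system instead of QP" point, here is a minimal sketch of the standard LSSVM dual with a linear kernel; it shows the generic classifier, not the paper's primal-dual feature selection framework, and the variable names and default `gamma` are assumptions.

```python
# Standard LSSVM dual: training reduces to one (n+1)x(n+1) linear system.
import numpy as np

def lssvm_train(X, y, gamma=1.0):
    """X: (n, d), y: (n,) labels in {-1, +1}. Returns (alpha, bias)."""
    n = X.shape[0]
    K = X @ X.T                                   # linear kernel
    Omega = (y[:, None] * y[None, :]) * K
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)                 # one linear solve, no QP
    return sol[1:], sol[0]

def lssvm_predict(X_train, y_train, alpha, b, X_test):
    return np.sign(X_test @ X_train.T @ (alpha * y_train) + b)
```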
Effective Discriminative Feature Selection with Non-trivial Solutions
Feature selection and feature transformation, the two main ways to reduce dimensionality, are often presented separately. In this paper, a feature selection method is proposed by combining the popular transformation-based dimensionality reduction method Linear Discriminant Analysis (LDA) with sparsity regularization. We impose row sparsity on the transformation matrix of LDA through l2,1-norm re...
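Based only on this snippet, the combined objective plausibly takes a form like the following, where $S_w$ and $S_t$ are the within-class and total scatter matrices; the exact constraint used in the paper may differ:

$$\min_{W}\ \operatorname{tr}\!\left(W^{\top} S_w W\right) + \gamma \lVert W \rVert_{2,1} \quad \text{s.t. } W^{\top} S_t W = I, \qquad \lVert W \rVert_{2,1} = \sum_{i=1}^{d} \lVert w^{i} \rVert_2 .$$

The row-sparsity term drives entire rows of the transformation matrix $W$ to zero, and a feature whose row vanishes contributes nothing to the projection, which is how a transformation method becomes a selection method.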
$l_{2,p}$ Matrix Norm and Its Application in Feature Selection
Recently, the l2,1 matrix norm has been widely applied in many areas such as computer vision, pattern recognition, and biological study. As an extension of the l1 vector norm, the mixed l2,1 matrix norm is often used to find jointly sparse solutions. Moreover, an efficient iterative algorithm has been designed to solve l2,1-norm-involved minimizations. Actually, computational studies have shown th...
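For reference, the mixed l2,p norm is the l_p norm of the vector of row-wise l2 norms, so p = 1 recovers the l2,1 norm used for jointly sparse solutions. A tiny sketch (assuming NumPy; the function name is illustrative):

```python
import numpy as np

def l2p_norm(M, p):
    """l2,p matrix norm: the l_p norm of the row-wise l2 norms of M.
    p = 1 gives the l2,1 norm that encourages whole rows to vanish."""
    row = np.sqrt((M ** 2).sum(axis=1))
    return (row ** p).sum() ** (1.0 / p)
```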
Feature Selection at the Discrete Limit
Feature selection plays an important role in many machine learning and data mining applications. In this paper, we propose to use the L2,p norm for feature selection, with emphasis on small p. As p → 0, feature selection becomes a discrete feature selection problem. We provide two algorithms, a proximal gradient algorithm and a rank-one update algorithm, the latter of which is more efficient at large regularization λ. W...
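As a concrete instance of the proximal gradient route, the sketch below uses the one case with a simple closed-form prox, p = 1 (row-wise soft thresholding), together with a smooth least-squares loss (1/2)||XW − Y||_F²; for the small p emphasized in the paper, the row-wise prox instead requires solving a scalar subproblem, omitted here. The loss choice, names, and iteration count are assumptions.

```python
# Proximal gradient for  min_W  (1/2)||X W - Y||_F^2 + lam * ||W||_{2,1}.
import numpy as np

def prox_l21(M, t):
    """Row-wise soft thresholding: prox of t * ||.||_{2,1}."""
    norms = np.sqrt((M ** 2).sum(axis=1, keepdims=True))
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return scale * M

def proximal_gradient(X, Y, lam=0.1, n_iter=200):
    d, c = X.shape[1], Y.shape[1]
    W = np.zeros((d, c))
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L for the smooth part
    for _ in range(n_iter):
        grad = X.T @ (X @ W - Y)             # gradient of the smooth loss
        W = prox_l21(W - step * grad, step * lam)
    return W
```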
A Novel Nonnegative Subspace Learning Approach for Unsupervised Feature Selection
Sparse subspace learning has been proven effective in data mining and machine learning. In this paper, we propose a novel scheme which performs robust feature selection with a non-negativity constraint and sparse subspace learning simultaneously. This work emphasizes joint l2...
Publication date: 2013